Aml 20867 paddle frontend var #20932
Conversation
Hi @hirwa-nshuti, I'm sorry to bother you, but I see that you worked on the same file before me, which is why I'm asking for your help with this; I hope that's ok with you. There are some failing tests and I'd really appreciate your feedback on them. Thank you.
Sure, no problem. I took a look and pointed out the cause of the failures; hope you'll update your PR to get the tests passing 😊
    backend_fw,
    test_flags,
):
    input_dtype, x, axis = dtype_and_x
The issue is here: this helper function returns a tuple of 4 values, but you're unpacking only 3.
- input_dtype, x, axis = dtype_and_x
+ input_dtype, x, axis, _ = dtype_and_x
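The failure mode above can be reproduced in plain Python. The tuple contents below are made up purely for illustration; only the arity matters (the helper yields 4 values):

```python
# Hypothetical stand-in for what the _statistical_dtype_values helper yields:
# a 4-tuple (dtypes, values, axis, correction).
dtype_and_x = (["float32"], [[1.0, 2.0, 3.0]], 0, 1)

# Unpacking a 4-tuple into three names raises ValueError.
try:
    input_dtype, x, axis = dtype_and_x
except ValueError as e:
    unpack_error = str(e)  # e.g. "too many values to unpack (expected 3)"

# Discarding the extra value with `_` makes the unpack succeed.
input_dtype, x, axis, _ = dtype_and_x
```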
@@ -45,3 +45,13 @@ def median(x, axis=None, keepdim=False, name=None):
        else ivy.astype(x, ivy.float32)
    )
    return ivy.median(x, axis=axis, keepdims=keepdim)

@with_supported_dtypes({"2.5.0 and below": ("float16", "float32", "float64")}, "paddle")
The version value should be 2.5.1.
Thank you very much for your help @hirwa-nshuti. I made the requested changes, but I still have failing tests (there are now more than before), so is that ok or do I need to change more things?
Hello! You don't need to worry about failing tests that are not related to your PR, so you can ignore run_tests (58) and run_tests (75). Regarding your failing tests, I get the following error for all backends:
Honestly, I've never seen that before. @hirwa-nshuti, do you know what might be happening?
Hi @ZoeCD @hirwa-nshuti, I am Sam. I am also working on the var function for the paddle.Tensor frontend method (#18302), and I am also getting this error. I found that the error occurs because the backend of the var function calls the _std function (mean), and mean doesn't support the float16 dtype; it only supports float32 and float64. I changed the dtype afterwards, but it still didn't work. I don't know how, but when I try the following code it passes all the tests:

# paddle.tensor.stat.py
@with_unsupported_dtypes({"2.5.1 and below": ("float16", "bfloat16")}, "paddle")
@to_ivy_arrays_and_back
def var(x, axis=None, unbiased=True, keepdim=False, *, out=None):
    return ivy.var(x, axis=axis, correction=int(unbiased), keepdims=keepdim, out=out)
@handle_frontend_test(
    fn_tree="paddle.var",
    dtype_and_x=_statistical_dtype_values(
        function="var",
        min_value=-1e04,
        max_value=1e04,
    ),
    keepdims=st.booleans(),
)
def test_paddle_var(
    *,
    dtype_and_x,
    keepdims,
    on_device,
    fn_tree,
    backend_fw,
    frontend,
    test_flags,
):
    input_dtype, x, axis, correction = dtype_and_x
    helpers.test_frontend_function(
        input_dtypes=input_dtype,
        frontend=frontend,
        test_flags=test_flags,
        fn_tree=fn_tree,
        on_device=on_device,
        backend_to_test=backend_fw,
        x=x[0],
        axis=axis,
        unbiased=bool(correction),
        keepdim=keepdims,
    )

This is the result:
I think there must be a problem with the paddle backend implementation of the var function, because when I worked on the var method in the jax frontend, all tests passed there except on the paddle backend. Can you please check whether the code I provided is a correct implementation? Then you can merge it, so I can call this var in my paddle.Tensor directly and it should work there as well. Thank you.
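For anyone unfamiliar with the `unbiased=bool(correction)` / `correction=int(unbiased)` mapping in the snippets above: it is Bessel's correction, where the unbiased estimator divides by n - 1 instead of n. A minimal plain-Python sketch (no ivy or paddle; the helper name `var` here is just illustrative):

```python
def var(xs, correction=0):
    """Variance of xs, dividing by (n - correction).

    correction=0 -> population (biased) variance, i.e. unbiased=False;
    correction=1 -> sample (unbiased) variance, i.e. unbiased=True.
    """
    n = len(xs)
    mean = sum(xs) / n
    return sum((v - mean) ** 2 for v in xs) / (n - correction)

data = [1.0, 2.0, 3.0, 4.0]
biased = var(data, correction=int(False))    # divides by n = 4 -> 1.25
unbiased = var(data, correction=int(True))   # divides by n - 1 = 3 -> 1.666...
```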
Hi @samthakur587, I'm curious about something (I'm still learning about ivy and its structure). You said that the backend var for paddle is the one with the problem, so why don't we fix that one instead of changing the frontend function? I looked at other var implementations (TensorFlow, Torch, Jax) to try to learn from them, and they seem much more complicated and different from the paddle var backend; also, the PR that merged it seems to have had a lot of failing tests.
Basically, the paddle.var function doesn't support float16, even though it is mentioned in the docs: https://www.paddlepaddle.org.cn/documentation/docs/en/api/paddle/var_en.html#var

import paddle

x = paddle.to_tensor([-1., -1.], dtype="float16")
print(paddle.var(x, axis=None, unbiased=True, keepdim=False, name=None))

This will throw a runtime error for an unsupported dtype. Removing float16 from the supported dtypes will fix the issue.
Also, please resolve the merge conflicts.
LGTM!
Thanks for contributing😊
Co-authored-by: hirwa-nshuti <hirwanshutiflx@gmail.com>
close #20867